  • The aim of this project is to equip ANUGA with a storm surge capability in partnership with the Department of Planning Western Australia (DoP), to take steps toward validating the methodology, and to provide DoP with a case study in the form of a storm surge scenario for Bunbury. The developed capability will provide a mechanism whereby DoP can investigate mitigation options for a range of hydrodynamic hazards.

  • In 2008, the performance of 14 statistical and mathematical methods for spatial interpolation was compared using samples of seabed mud content across the Australian Exclusive Economic Zone (AEEZ); that comparison indicated that machine learning methods are generally among the most accurate. In this study, we further test the performance of machine learning methods in combination with ordinary kriging (OK) and inverse distance squared (IDS). We aim to identify the most accurate methods for spatial interpolation of seabed mud content in three regions (i.e., N, NE and SW) of the AEEZ using samples extracted from Geoscience Australia's Marine Samples Database (MARS). The performance of 18 methods (machine learning methods and their combinations with OK or IDS) is compared using a simulation experiment. The prediction accuracy changes with the method, the inclusion or exclusion of slope, the search window size, model averaging and the study region. The combination of random forest (RF) and OK (RFOK) and the combination of RF and IDS (RFIDS) are, on average, more accurate than the other methods based on the prediction accuracy and visual examination of prediction maps in all three regions, when slope is included and when their search window sizes are 12 and 7, respectively. Averaging the predictions of these two most accurate methods could be an alternative for spatial interpolation. The methods identified in this study reduce the prediction error by up to 19%, and their predictions depict the transitional zones between geomorphic features in comparison with the control. This study confirmed the effectiveness of combining machine learning methods with OK or IDS and produced an alternative source of methods for spatial interpolation. The procedures employed in this study for selecting the most accurate prediction methods provide guidance for future studies.
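The RFIDS-style hybrid described above (a regression model for the trend plus inverse-distance-squared interpolation of its residuals) can be sketched in a few lines. This is illustrative only: a toy linear trend function stands in for a fitted random forest, and the sample points, values, window size and function names are invented for the example.

```python
def ids_interpolate(points, values, target, window=7):
    """Inverse distance squared (IDS): weight each of the `window` nearest
    samples by 1/d^2 and return the weighted average at `target`."""
    def d2(p):
        return (p[0] - target[0]) ** 2 + (p[1] - target[1]) ** 2
    nearest = sorted(zip(points, values), key=lambda pv: d2(pv[0]))[:window]
    num = den = 0.0
    for p, v in nearest:
        dist2 = d2(p)
        if dist2 == 0.0:
            return v  # target coincides with a sample point
        num += v / dist2
        den += 1.0 / dist2
    return num / den

def rfids_predict(trend, points, values, target, window=7):
    """Hybrid prediction: trend estimate plus IDS-interpolated trend residuals."""
    residuals = [v - trend(p) for p, v in zip(points, values)]
    return trend(target) + ids_interpolate(points, residuals, target, window)

# Toy data: a linear trend stands in for the fitted random forest,
# and every sample sits exactly 1.0 above that trend.
trend = lambda p: p[0]
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
vals = [trend(p) + 1.0 for p in pts]
print(rfids_predict(trend, pts, vals, (0.5, 0.5)))  # -> 1.5 (trend 0.5 + residual 1.0)
```

Swapping in a real random forest (or any regression model) for `trend`, and OK in place of `ids_interpolate`, gives the RFOK variant; the residual-correction structure is the same.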

  • One of the important inputs to a probabilistic seismic hazard assessment is the expected rate at which earthquakes occur within the study region. The rate of earthquakes is a function of the rate at which the crust is being deformed, mostly by tectonic stresses. This paper will present two contrasting methods of estimating the strain rate at the scale of the Australian continent. The first method is based on statistically analysing the recently updated national earthquake catalogue, while the second uses a geodynamic model of the Australian plate and the forces that act upon it. For the first method, we show a couple of examples of the strain rates predicted across Australia using different statistical techniques. However, no matter which technique is used, the measurable seismic strain rates are typically in the range of 10⁻¹⁶ s⁻¹ to around 10⁻¹⁸ s⁻¹, depending on location. By contrast, the geodynamic model predicts a much more uniform strain rate of around 10⁻¹⁷ s⁻¹ across the continent. The level of uniformity of the true distribution of long-term strain rate in Australia is likely to lie somewhere between these two extremes. Neither estimate is consistent with the Australian plate being completely rigid and free from internal deformation (i.e., a strain rate of exactly zero). This paper will also give an overview of how this kind of work affects the national earthquake hazard map and how future high-precision geodetic estimates of strain rate should help to reduce the uncertainty in this important parameter for probabilistic seismic hazard assessments.
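To make strain rates of this magnitude tangible, they can be converted into a relative velocity across a baseline: velocity = strain rate × baseline length. The sketch below uses an illustrative ~4000 km continental baseline (an assumption for the example, not a figure from the abstract); a uniform 10⁻¹⁷ s⁻¹ then implies roughly a millimetre per year of relative motion.

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ≈ 3.156e7 s

def relative_velocity_mm_per_yr(strain_rate_per_s, baseline_km):
    """Relative velocity implied by a uniform strain rate over a baseline.

    A strain rate e (1/s) acting over a length L (m) accumulates relative
    displacement at e * L metres per second; convert to mm/yr for intuition.
    """
    v_m_per_s = strain_rate_per_s * baseline_km * 1000.0
    return v_m_per_s * SECONDS_PER_YEAR * 1000.0  # m/s -> mm/yr

# 1e-17 /s over an illustrative 4000 km baseline: ~1.3 mm/yr
print(relative_velocity_mm_per_yr(1e-17, 4000))
```

The same conversion shows why the quoted two-order-of-magnitude spread (10⁻¹⁶ to 10⁻¹⁸ s⁻¹) matters: it spans roughly 0.1 mm/yr to 10 mm/yr over the same baseline, which is the difference between motion below and well above what precise geodesy can resolve.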

  • In this study, we aim to identify the most appropriate methods for spatial interpolation of seabed sand content for the AEEZ using samples extracted in August 2010 from Geoscience Australia's Marine Samples Database. The predictive accuracy changes with the method, the input secondary variables, model averaging, the search window size and the study region, but not with the choice of mtry. No single method performs best for all the tested scenarios. Of the 18 compared methods, RFIDS and RFOK are the most accurate in all three regions. Overall, of the 36 combinations of input secondary variables, methods and regions, RFIDS, 6RFIDS and RFOK were among the most accurate methods in all three regions. Model averaging further improved the prediction accuracy. The most accurate methods reduced the prediction error by up to 7%. RFOKRFIDS, with a search window size of 5 and an mtry of 4, produced more realistic predictions than the control and is recommended for predicting sand content across the AEEZ if a single method is required. This study provides suggestions and guidelines for improving the spatial interpolation of marine environmental data.
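Model averaging of the two hybrid methods (as in the RFOKRFIDS approach mentioned above) amounts to taking the element-wise mean of their predictions at the same target locations. A minimal sketch, with invented prediction values purely to illustrate the mechanics:

```python
def average_predictions(*prediction_lists):
    """Model averaging: element-wise mean of predictions from several methods
    (e.g., RFOK and RFIDS) evaluated at the same target locations."""
    return [sum(vals) / len(vals) for vals in zip(*prediction_lists)]

# Hypothetical sand-content predictions (%) from two methods at three locations
rfok = [10.0, 20.0, 30.0]
rfids = [12.0, 18.0, 34.0]
print(average_predictions(rfok, rfids))  # -> [11.0, 19.0, 32.0]
```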

  • Geoscience Australia is supporting the exploration and development of offshore oil and gas resources and the establishment of Australia's national representative system of marine protected areas through provision of spatial information about the physical and biological character of the seabed. Central to this approach is prediction of Australia's seabed biodiversity from spatially continuous data of physical seabed properties. However, information for these properties is usually collected at sparsely distributed discrete locations, particularly in the deep ocean. Thus, methods for generating spatially continuous information from point samples become essential tools. Such methods are, however, often data- or even variable-specific, and it is difficult to select an appropriate method for any given dataset. Improving the accuracy of these physical data for biodiversity prediction, by searching for the most robust spatial interpolation methods to predict physical seabed properties, is essential to better inform resource management practices. In this regard, we conducted a simulation experiment to compare the performance of statistical and mathematical methods for spatial interpolation using samples of seabed mud content across the Australian margin. Five factors that affect the accuracy of spatial interpolation were considered: 1) region; 2) statistical method; 3) sample density; 4) search neighbourhood; and 5) sample stratification by geomorphic provinces. Bathymetry, distance-to-coast and slope were used as secondary variables. In this study, we only report the results of the comparison of 14 methods (37 sub-methods) using samples of seabed mud content with five levels of sample density across the southwest Australian margin. The results of the simulation experiment can be applied to spatial data modelling of various physical parameters in different disciplines and have application to a variety of resource management applications for Australia's marine region.
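A simulation experiment over several factors like the five listed above is, structurally, a full factorial sweep. The sketch below shows only the mechanics: the factor names follow the abstract, but the levels are placeholders and a dummy error function stands in for a real cross-validation run.

```python
import itertools
import statistics

# Illustrative factor levels (names follow the abstract; values are placeholders)
factors = {
    "region": ["N", "NE", "SW"],
    "method": ["OK", "IDS", "RF"],
    "sample_density": [0.05, 0.10, 0.25, 0.50, 1.00],
    "search_neighbourhood": [5, 7, 12],
    "stratified": [False, True],
}

def run_trial(region, method, density, neighbourhood, stratified):
    """Placeholder for one cross-validation run returning a prediction error
    (e.g., RMSE); a real study would fit `method` to a random sub-sample here."""
    return 1.0  # dummy error so the sketch is runnable

# Evaluate every factor combination over repeated random sub-samples
results = {}
for combo in itertools.product(*factors.values()):
    errors = [run_trial(*combo) for _ in range(3)]
    results[combo] = statistics.mean(errors)

print(len(results))  # 3 * 3 * 5 * 3 * 2 = 270 factor combinations
```

Ranking `results` by mean error (per region, or overall) is then what identifies the most robust method under each scenario.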

  • Keynote presentation covering: the background to tsunami modelling in Australia; what the modelling showed; why the modelling is important to emergency managers; the importance of partnerships; and future challenges.

  • The major tsunami disaster in the Indian Ocean in 2004, and the subsequent large events off the south coast of Indonesia and in the Solomon Islands, have dramatically raised awareness of the possibility of potentially damaging tsunamis in the Australian region. Since the 2004 Indian Ocean Tsunami (IOT), a number of emergency management agencies have worked with Geoscience Australia to help develop an understanding of the tsunami hazard faced by their jurisdictions. Here I will discuss both the major tsunamis over the last few years in the region and the recent efforts of Geoscience Australia and others to estimate the likelihood of such events in the future. Since 2004, a range of probabilistic and scenario-based hazard assessments have been completed through collaborative projects between Geoscience Australia and other agencies in Australia and the region. These collaborations have resulted in some of the first ever probabilistic tsunami hazard assessments to be completed for Australia and for a wide range of other countries in the southwest Pacific and Indian Oceans. These assessments estimate not only the amplitude of a tsunami that could reach the coast but also its probability. The assessments allow crucial questions from emergency managers (such as 'Just how often do large tsunamis reach our coasts?') to be quantitatively addressed. In addition, they also provide a mechanism to prioritise communities for more detailed risk assessments. This work allows emergency managers to base their decisions on the best available science and data for their jurisdiction instead of relying solely on intuition.

  • Abstract for the final talk in the earthquake hazard map session planned for the 2011 AEES meeting.

  • For the past decade, staff at Geoscience Australia (GA), Australia's Commonwealth Government geoscientific agency, have routinely performed 3D gravity and magnetic modelling as part of our geoscience investigations. For this work, we have used a number of different commercial software programs, all of which have been based on a Cartesian mesh spatial framework. These programs have come as executable files that were compiled to operate in a Windows environment on single-core personal computers (PCs). In recent times, a number of factors have caused us to consider a new approach to this modelling work. The drivers for change include: 1) models with very large lateral extents where the effects of Earth curvature are a consideration, 2) a desire to ensure that the modelling of separate regions is carried out in a consistent and managed fashion, 3) migration of scientific computing to off-site High Performance Computing (HPC) facilities, and 4) development of virtual globe environments for integration and visualization of 3D spatial objects. Our response has been to do the following: 1) form a collaborative partnership with researchers at the Colorado School of Mines (CSM) and the China University of Geosciences (CUG) to develop software for spherical mesh modelling of gravity and magnetic data, 2) ensure that we have access to the source code for any modelling software so that we can customize and compile it for the HPC environment of our choosing, 3) learn about the different types of HPC environments, 4) investigate which type of HPC environment would offer the optimum mix of availability to us, compute resources and architecture, and 5) promote the in-house development of a freely available virtual globe application called 'EarthSci', built on the open-source Eclipse Rich Client Platform (RCP), which in turn uses the NASA World Wind Software Development Kit (SDK) as its globe rendering engine.

  • Effective disaster risk reduction is founded on knowledge of the underlying risk. While methods and tools for assessing risk from specific hazards or to individual assets are generally well developed, our ability to holistically assess risk to a community across a range of hazards and elements at risk remains limited. Developing a holistic view of risk requires interdisciplinary collaboration amongst a wide range of hazard scientists, engineers and social scientists, as well as engagement of a range of stakeholders. This paper explores these challenges, drawing on common and contrasting issues sampled from a range of applications addressing earthquake, tsunami, volcano, severe wind, flood and sea-level rise in projects in Australia, Indonesia and the Philippines. Key issues range from the availability of appropriate risk assessment tools and data to the ability of communities to implement appropriate risk reduction measures. Quantifying risk requires information on the hazard, the exposure and the vulnerability. Often the knowledge of the hazard is reasonably well constrained, but exposure information (e.g., people and their assets) and measures of vulnerability (i.e., susceptibility to injury or damage) are inconsistent or unavailable. In order to fill these gaps, Geoscience Australia has developed computational models and tools which are open and freely available. As the knowledge gaps become smaller, the need is growing to go beyond the quantification of risk to the provision of tools to aid in selecting the most appropriate risk reduction strategies (e.g., evacuation plans, building retrofits, insurance, or land use) to build community resilience.